Image-assisted modeling from sketches
Authors
Abstract
In this paper, we propose a method for creating freeform surfaces from sketch-annotated images. Starting from an image, the user sketches object boundaries, features, and holes. Sketching is made easier by a magnetic pen that follows strong edges in the image. To create a surface from the sketch, we construct a planar mesh whose geometry aligns with the boundary and interior features. We then inflate the mesh to 3D using a discrete distance transform filtered through a cross-sectional mapping function. Finally, the input image is applied to the surface as a texture. We demonstrate the benefits of our framework with examples of modeling both freeform and manufactured objects.
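The inflation step lends itself to a short illustration. The Python sketch below is a minimal, assumption-laden rendition of the idea in the abstract: it computes a discrete distance transform inside a binary mask of the sketched region and remaps it through a simple quarter-circle cross-section profile to obtain a height field. The function name `inflate_height_field`, the use of SciPy's `distance_transform_edt`, and the particular profile are illustrative choices, not the paper's exact formulation (which operates on a planar mesh constrained to boundary and interior features).

```python
# Illustrative sketch only: inflate a 2D region into a height field via a
# distance transform and a cross-sectional mapping function. The profile
# and parameters are assumptions, not the paper's exact method.
import numpy as np
from scipy.ndimage import distance_transform_edt

def inflate_height_field(mask, max_height=None):
    """Turn a 2D binary region into a height field.

    mask: 2D bool array, True inside the sketched boundary.
    max_height: peak height of the inflated surface; defaults to the
        largest interior distance, giving a roughly circular cross-section.
    """
    # Distance from each interior pixel to the region boundary.
    dist = distance_transform_edt(mask)
    d_max = dist.max()
    if d_max == 0:
        return np.zeros_like(dist)
    if max_height is None:
        max_height = d_max

    # Cross-sectional mapping: pass normalized distance through a
    # quarter-circle profile so the surface meets the boundary tangentially.
    t = dist / d_max
    height = max_height * np.sqrt(np.clip(2.0 * t - t * t, 0.0, 1.0))
    return height

# Usage: a filled disk inflates into an approximate hemisphere.
yy, xx = np.mgrid[0:128, 0:128]
disk = (xx - 64) ** 2 + (yy - 64) ** 2 < 50 ** 2
z = inflate_height_field(disk)
```

In this toy version the mapping function controls the cross-section of the inflated shape; swapping the quarter-circle profile for a flatter or sharper curve changes how bulbous the result is, which is the role the cross-sectional mapping plays in the pipeline described above.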
Related references
Data augmentation-assisted deep learning of hand-drawn partially colored sketches for visual search
In recent years, image databases have been growing at exponential rates, making their management, indexing, and retrieval very challenging. Typical image retrieval systems rely on sample images as queries. However, in the absence of sample query images, hand-drawn sketches are also used. The recent adoption of touch screen input devices makes it very convenient to quickly draw shaded sketches of obj...
MarrNet: 3D Shape Reconstruction via 2.5D Sketches
3D object reconstruction from a single image is a highly under-determined problem, requiring strong prior knowledge of plausible 3D shapes. This introduces challenges for learning-based approaches, as 3D object annotations are scarce in real images. Previous work chose to train on synthetic data with ground truth 3D information, but suffered from domain adaptation when tested on real data. In t...
SketchSoup: Exploratory Ideation Using Design Sketches
A hallmark of early stage design is a number of quick-and-dirty sketches capturing design inspirations, model variations, and alternate viewpoints of a visual concept. We present SketchSoup, a workflow that allows designers to explore the design space induced by such sketches. We take an unstructured collection of drawings as input, along with a small number of user-provided correspondences as ...
Collaborative 3D Modeling by the Crowd
We propose a collaborative 3D modeling system that deconstructs the complex 3D modeling process into a collection of simple tasks to be executed by nonprofessional crowd workers. Given a 2D image showing a target object, each crowd worker is directed to draw a simple sketch representing an orthographic view of the object, using their visual cognition and real-world knowledge. The system then sy...
Learning Sketch-based 3D Modelling from user’s sketching gestures
To infer three-dimensional models from two-dimensional sketches, most existing research focuses on image similarity, requiring a minimum of drawing skill that is often beyond a regular user's capability. This paper proposes to learn the mapping from the user's sketch to the three-dimensional model by considering the sketch as a set of gestures containing information that denotes the u...